The previous articles all set things up on the backend, with the frontend connecting to it to use the relevant features. This article covers how to set everything up on the frontend instead — directly inside the Android app — and what that approach makes possible.
First, add the libraries. As before, these are Microsoft's Semantic Kernel libraries:
implementation 'com.microsoft.semantic-kernel:semantickernel-api:1.2.2'
implementation 'com.microsoft.semantic-kernel:semantickernel-aiservices-openai:1.2.2'
Next, add the following to the `android` block of the module-level build.gradle. These exclusions resolve the duplicate META-INF file conflicts that Netty and other transitive dependencies would otherwise cause at build time:
android {
    packagingOptions {
        resources {
            excludes += [
                'META-INF/INDEX.LIST',
                'META-INF/io.netty.versions.properties',
                'META-INF/DEPENDENCIES',
                'META-INF/LICENSE',
                'META-INF/LICENSE.txt',
                'META-INF/license.txt',
                'META-INF/NOTICE',
                'META-INF/NOTICE.txt',
                'META-INF/notice.txt',
                'META-INF/*.kotlin_module'
            ]
        }
    }
}
As in the earlier posts, we'll use the official sample plugin as the example:
public class LightPlugin {
    // Mock data for the lights
    private final Map<Integer, LightModel> lights = new HashMap<>();

    public LightPlugin() {
        lights.put(1, new LightModel(1, "Table Lamp", false));
        lights.put(2, new LightModel(2, "Porch light", false));
        lights.put(3, new LightModel(3, "Chandelier", true));
    }

    @DefineKernelFunction(name = "get_lights", description = "Gets a list of lights and their current state")
    public List<LightModel> getLights() {
        System.out.println("Getting lights");
        return new ArrayList<>(lights.values());
    }

    @DefineKernelFunction(name = "change_state", description = "Changes the state of the light")
    public LightModel changeState(
            @KernelFunctionParameter(name = "id", description = "The ID of the light to change") int id,
            @KernelFunctionParameter(name = "isOn", description = "The new state of the light") boolean isOn) {
        System.out.println("Changing light " + id + " " + isOn);
        if (!lights.containsKey(id)) {
            throw new IllegalArgumentException("Light not found");
        }
        lights.get(id).setIsOn(isOn);
        return lights.get(id);
    }
}
public class LightModel {
    private final int id;
    private final String name;
    private boolean isOn;

    public LightModel(int id, String name, boolean isOn) {
        this.id = id;
        this.name = name;
        this.isOn = isOn;
    }

    public int getId() {
        return id;
    }

    public String getName() {
        return name;
    }

    public boolean isOn() {
        return isOn;
    }

    public void setIsOn(boolean isOn) {
        this.isOn = isOn;
    }
}
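Before wiring the plugin into the kernel, it can help to sanity-check the plugin logic on its own. The sketch below is plain Java with the Semantic Kernel annotations dropped and illustrative stand-in names (`LightPluginDemo`, `Light`); it walks the same flow the model will later drive: seed the mock data, then flip one light's state.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LightPluginDemo {

    // Minimal stand-in for LightModel: id, name, and a mutable on/off state
    static class Light {
        final int id;
        final String name;
        boolean isOn;
        Light(int id, String name, boolean isOn) {
            this.id = id; this.name = name; this.isOn = isOn;
        }
    }

    static final Map<Integer, Light> lights = new LinkedHashMap<>();

    // Mirrors change_state: validate the id, then update the stored state
    static Light changeState(int id, boolean isOn) {
        Light light = lights.get(id);
        if (light == null) {
            throw new IllegalArgumentException("Light not found");
        }
        light.isOn = isOn;
        return light;
    }

    public static void main(String[] args) {
        // Same mock data as the article's LightPlugin constructor
        lights.put(1, new Light(1, "Table Lamp", false));
        lights.put(2, new Light(2, "Porch light", false));
        lights.put(3, new Light(3, "Chandelier", true));

        Light updated = changeState(1, true);
        System.out.println(updated.name + " is now " + (updated.isOn ? "on" : "off"));
        // prints: Table Lamp is now on
    }
}
```

An unknown id throws `IllegalArgumentException`, exactly as in the plugin above — that exception is what the kernel will surface if the model asks for a light that doesn't exist.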
Set up the related pieces just as in the previous posts, and you're basically done:
// Semantic Kernel
private final String modelId = "gpt-4o-mini";
private final String OPEN_AI_KEY = "API Key";
private OpenAIAsyncClient client;
private ChatCompletionService chatCompletionService;
private Kernel kernel;
private InvocationContext invocationContext;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    setSemanticKernel();
}

private void setSemanticKernel() {
    // Async client that talks to OpenAI; for Azure OpenAI you would also set .endpoint(...)
    client = new OpenAIClientBuilder()
            .credential(new AzureKeyCredential(OPEN_AI_KEY))
            .buildAsyncClient();

    chatCompletionService = OpenAIChatCompletion.builder()
            .withOpenAIAsyncClient(client)
            .withModelId(modelId)
            .build();

    // Wrap LightPlugin so its annotated methods become kernel functions
    KernelPlugin lightPlugin = KernelPluginFactory.createFromObject(new LightPlugin(),
            "LightsPlugin");

    kernel = Kernel.builder()
            .withAIService(ChatCompletionService.class, chatCompletionService)
            .withPlugin(lightPlugin)
            .build();

    // Tell the kernel how to render a LightModel inside a prompt (as JSON, via Gson)
    ContextVariableTypes
            .addGlobalConverter(
                    ContextVariableTypeConverter.builder(LightModel.class)
                            .toPromptString(new Gson()::toJson)
                            .build());

    // Return only the final message, and allow the model to call any kernel function
    invocationContext = new InvocationContext.Builder()
            .withReturnMode(InvocationReturnMode.LAST_MESSAGE_ONLY)
            .withToolCallBehavior(ToolCallBehavior.allowAllKernelFunctions(true))
            .build();
}
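With everything registered, a call would look roughly like the sketch below. This is a sketch, not a tested implementation: it assumes the chat-completion API shown in Microsoft's Semantic Kernel Java samples (`ChatHistory`, `getChatMessageContentsAsync`), and since the request goes over the network it must not run on the Android main thread, hence the background thread.

```java
// Sketch: send one user prompt and let the model call the LightsPlugin
// functions according to the invocationContext configured above.
new Thread(() -> {
    ChatHistory history = new ChatHistory();
    history.addUserMessage("Turn on the table lamp");

    List<ChatMessageContent<?>> results = chatCompletionService
            .getChatMessageContentsAsync(history, kernel, invocationContext)
            .block();  // blocking is fine here because we are off the main thread

    Log.d("SK", results.get(0).getContent());
}).start();
```

If anything here needs to update the UI, remember to hop back with `runOnUiThread(...)`.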
The next post will test what we've built here, and later posts will explore more usage scenarios.